Sequential minimal optimization for SVM with pinball loss

Authors

  • Xiaolin Huang
  • Lei Shi
  • Johan A. K. Suykens
Abstract

To pursue insensitivity to feature noise and stability under re-sampling, a new type of support vector machine (SVM) has been established by replacing the hinge loss in the classical SVM with the pinball loss, and is hence called pin-SVM. Though a different loss function is used, pin-SVM has a similar structure to the classical SVM. Specifically, the dual problem of pin-SVM is a quadratic programming problem with box constraints, to which the sequential minimal optimization (SMO) technique is applicable. In this paper, we establish SMO algorithms for pin-SVM and its sparse version. Numerical experiments on real-life data sets illustrate both the good performance of pin-SVMs and the effectiveness of the established SMO methods.
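
As a sketch of why SMO carries over: in the pin-SVM dual the objective and the equality constraint are those of the classical SVM, and only the box changes, each multiplier being confined to -τC ≤ α_i ≤ C (with τ the pinball parameter) instead of 0 ≤ α_i ≤ C. The Python fragment below illustrates the analytic two-variable SMO update under such a modified box; the names (smo_pair_update, the cached errors array) and the omission of the bias and working-set-selection steps are illustrative choices, a minimal sketch rather than the authors' implementation.

import numpy as np

def smo_pair_update(i, j, alpha, y, K, errors, C, tau, eps=1e-12):
    """One analytic SMO step on (alpha[i], alpha[j]) for a dual with
    box constraints -tau*C <= alpha <= C and sum(alpha * y) = 0.
    Returns True if the pair was changed."""
    lo, hi = -tau * C, C                      # pin-SVM box; classical SVM has lo = 0

    # Feasible interval [L, H] for alpha[j] that keeps sum(alpha * y) = 0
    if y[i] != y[j]:
        L = max(lo, alpha[j] - alpha[i] + lo)
        H = min(hi, alpha[j] - alpha[i] + hi)
    else:
        L = max(lo, alpha[i] + alpha[j] - hi)
        H = min(hi, alpha[i] + alpha[j] - lo)
    if H - L < eps:
        return False

    eta = K[i, i] + K[j, j] - 2.0 * K[i, j]   # curvature along the chosen pair
    if eta < eps:                             # degenerate pair skipped in this sketch
        return False

    a_j = float(np.clip(alpha[j] + y[j] * (errors[i] - errors[j]) / eta, L, H))
    if abs(a_j - alpha[j]) < eps:
        return False
    a_i = alpha[i] + y[i] * y[j] * (alpha[j] - a_j)

    # Keep the cached errors consistent (bias update omitted for brevity)
    errors += (a_i - alpha[i]) * y[i] * K[i, :] + (a_j - alpha[j]) * y[j] * K[j, :]
    alpha[i], alpha[j] = a_i, a_j
    return True

Sweeping such updates over pairs that violate the KKT conditions of the box-constrained dual, and recovering the bias afterwards from multipliers strictly inside the box, gives the usual SMO scheme; only the bounds L and H differ from the hinge-loss case.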

Similar resources

Ellipsoidal Support Vector Machines

This paper proposes the ellipsoidal SVM (e-SVM) that uses an ellipsoid center, in the version space, to approximate the Bayes point. Since SVM approximates it by a sphere center, e-SVM provides an extension to SVM for better approximation of the Bayes point. Although the idea has been mentioned before (Ruján (1997)), no work has been done for formulating and kernelizing the method. Starting fro...

A Unifying Framework in Vector-valued Reproducing Kernel Hilbert Spaces for Manifold Regularization and Co-Regularized Multi-view Learning

This paper presents a general vector-valued reproducing kernel Hilbert spaces (RKHS) framework for the problem of learning an unknown functional dependency between a structured input space and a structured output space. Our formulation encompasses both Vector-valued Manifold Regularization and Co-regularized Multi-view Learning, providing in particular a unifying framework linking these two imp...

Fast Kernel Learning using Sequential Minimal Optimization

While classical kernel-based classifiers are based on a single kernel, in practice it is often desirable to base classifiers on combinations of multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for the support vector machine (SVM), and showed that the optimization of the coefficients of such a combination reduces to a convex optimization problem known as...

New ν-Support Vector Machines and their Sequential Minimal Optimization

Although the ν-Support Vector Machine, ν-SVM, (Schölkopf et al., 2000) has the advantage of using a single parameter ν to control both the number of support vectors and the fraction of margin errors, there are two issues that prevent it from being used in many real world applications. First, unlike the C-SVM that allows asymmetric misclassification cost, ν-SVM uses a symmetric misclassification...

Research on SVM Algorithm with Particle Swarm Optimization

Support Vector Machines (SVM) is a practical algorithm that has been widely used in many areas. To guarantee satisfactory performance, it is important to set appropriate parameters of the SVM algorithm. Sequential Minimal Optimization (SMO) is an effective training algorithm for SVM, i.e. LS_SVM. Therefore, on the basis of the SMO algorithm and LS_SVM, which integrates the SMO algorithm and L...


Journal:
  • Neurocomputing

Volume 149, Issue

Pages -

Publication date: 2015